
Evolutionary Training of Sparse Artificial Neural Networks: A Network Science Perspective



Abstract

Through the success of deep learning, Artificial Neural Networks (ANNs) are among the most used artificial intelligence methods nowadays. ANNs have led to major breakthroughs in various domains, such as particle physics, reinforcement learning, speech recognition, computer vision, and so on. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that (contrary to general practice) Artificial Neural Networks (ANNs), too, should not have fully-connected layers. We show that ANNs perform perfectly well with sparsely-connected layers. Following a Darwinian evolutionary approach, we propose a novel algorithm which evolves an initial random sparse topology (i.e. an Erd\H{o}s-R\'enyi random graph) of two consecutive layers of neurons into a scale-free topology during the ANN training process. The resulting sparse layers can safely replace the corresponding fully-connected layers. Our method yields a quadratic reduction in the number of parameters in the fully-connected layers of ANNs, and hence quadratically faster computation in both phases (i.e. training and inference), at no decrease in accuracy. We demonstrate our claims on two popular ANN types (restricted Boltzmann machine and multi-layer perceptron), on two types of tasks (supervised and unsupervised learning), and on 14 benchmark datasets. We anticipate that our approach will enable ANNs having billions of neurons and evolved topologies to be capable of handling complex real-world tasks that are intractable using state-of-the-art methods.
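The evolutionary step the abstract describes — start from an Erd\H{o}s-R\'enyi random sparse topology, then repeatedly remove the weakest connections and regrow the same number at random — can be sketched as follows. This is a simplified NumPy illustration, not the paper's reference implementation: the connection probability `epsilon * (n_in + n_out) / (n_in * n_out)`, the rewiring fraction `zeta`, and pruning by absolute weight magnitude are assumptions chosen to mirror the described behaviour.

```python
import numpy as np

rng = np.random.default_rng(0)

def erdos_renyi_mask(n_in, n_out, epsilon=11, rng=rng):
    # Sparse bipartite topology between two consecutive layers:
    # each of the n_in * n_out possible connections exists with
    # probability p, so the expected number of weights grows only
    # linearly (not quadratically) in the layer sizes.
    p = epsilon * (n_in + n_out) / (n_in * n_out)
    return rng.random((n_in, n_out)) < p

def evolve_connections(weights, mask, zeta=0.3, rng=rng):
    """One evolutionary step: remove the fraction `zeta` of existing
    connections with the smallest absolute weight, then grow the same
    number of new connections at random currently-empty positions."""
    w = np.where(mask, weights, 0.0)
    alive = np.argwhere(mask)
    n_rewire = int(zeta * len(alive))
    # Rank alive connections by |weight| and drop the weakest ones.
    order = np.argsort(np.abs(w[mask]))
    for i, j in alive[order[:n_rewire]]:
        mask[i, j] = False
        w[i, j] = 0.0
    # Regrow the same number of connections at random empty positions,
    # initialised with small random weights.
    empty = np.argwhere(~mask)
    for i, j in empty[rng.choice(len(empty), n_rewire, replace=False)]:
        mask[i, j] = True
        w[i, j] = rng.normal(0.0, 0.01)
    return w, mask
```

Applied once per training epoch (after the usual gradient updates), this keeps the number of parameters constant while letting the topology drift away from the initial random graph toward one concentrated on the most useful connections.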
